44 research outputs found

    IMAGE-2006 Mosaic: Geometric and Radiometric Consistency of Input Imagery

    Get PDF
    Within their domain of overlap, two images may differ in both geometry and radiometry. Consequently, when they are mosaicked, these differences may reveal the position of the seam lines even if the seams follow salient image structures such as roads and streams. A pair of overlapping images is said to be consistent if the two images agree with one another in both geometry and radiometry. In this report, consistency is measured using correlation computations and linear regressions. Measurements are produced for all existing pairs of overlapping images (given the 3,699 IMAGE-2006 input images, there are 29,447 such pairs). The quality layers of the IMAGE-2006 mosaics rely directly on these measurements: the agreement between any pair of adjacent pieces of the mosaic is determined by the consistency measurements calculated within the domain of overlap of the two images from which these pieces are derived. JRC.H.6-Digital Earth and Reference Data
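    The abstract does not spell out the formulas used in the report; as a minimal sketch of the kind of measurement described, the radiometric part of a pairwise check could be a correlation coefficient plus a least-squares line fitted over the co-registered overlap pixels of the two images. The helper below (overlap_consistency, NumPy) is hypothetical.

```python
# Hedged sketch: radiometric consistency of two co-registered, overlapping images.
# The report's actual procedure is not given in the abstract; this only illustrates
# the kind of correlation / linear-regression measurements it mentions.
import numpy as np

def overlap_consistency(img_a, img_b, valid_mask):
    """Correlation and linear-regression fit between two images on their overlap.

    img_a, img_b : 2-D arrays covering the same grid (already co-registered).
    valid_mask   : boolean array marking the domain of overlap.
    """
    a = img_a[valid_mask].astype(float)
    b = img_b[valid_mask].astype(float)

    # Pearson correlation over the overlap: close to 1.0 means the radiometries agree
    # up to an affine transform; low values flag geometric or radiometric mismatch.
    r = np.corrcoef(a, b)[0, 1]

    # Least-squares line b ~= gain * a + offset; a gain near 1 and an offset near 0
    # indicate that no radiometric adjustment is needed across the seam.
    gain, offset = np.polyfit(a, b, deg=1)
    return r, gain, offset

# Example with synthetic data: img_b is a slightly rescaled, noisy copy of img_a.
rng = np.random.default_rng(0)
img_a = rng.uniform(0, 255, size=(100, 100))
img_b = 1.05 * img_a + 3.0 + rng.normal(0, 1.0, size=img_a.shape)
mask = np.ones_like(img_a, dtype=bool)
print(overlap_consistency(img_a, img_b, mask))
```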

    Edge-Preserving Smoothing Using a Similarity Measure in Adaptive Geodesic Neighbourhoods

    Get PDF
    Journal article published in Pattern Recognition (Elsevier); journal homepage: www.elsevier.com/locate/pr

    Edge-preserving smoothing of high-resolution images with a partial multifractal reconstruction scheme

    Get PDF
    The new generation of satellites has led to the arrival of very high-resolution images, which offer a new quality of detailed information about the Earth's surface. However, the exploitation of such data becomes more complicated and less efficient as a consequence of the great heterogeneity of the objects displayed. In this paper, we address the problem of edge-preserving smoothing of high-resolution satellite images. We introduce a novel approach as a preprocessing step for feature extraction and/or image segmentation. The method is derived from the multifractal formalism proposed for image compression, and it smooths heterogeneous areas while preserving the main edges of the image. It is performed in two steps: 1) a multifractal decomposition scheme extracts the most informative subset of the image, which consists essentially of the edges of the objects; 2) a propagation scheme performed over this subset reconstructs an approximation of the original image with a more uniform distribution of luminance. The strategy we adopt is described in the paper, and some results are presented for SPOT acquisitions with a spatial resolution of 20 m × 20 m.
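    The abstract only outlines the two steps. The sketch below uses the strongest gradient magnitudes as a stand-in for the "most informative subset" and a simple iterative averaging with those pixels held fixed as the propagation step; it is an assumption about the general mechanism (NumPy/SciPy), not the authors' scheme.

```python
# Hedged sketch of an edge-subset extraction + propagation reconstruction,
# loosely following the two steps described in the abstract (not the authors' code).
import numpy as np
from scipy import ndimage

def partial_reconstruction(img, keep_fraction=0.1, iterations=200):
    img = img.astype(float)

    # Step 1 (proxy): keep the pixels with the strongest gradient magnitude,
    # standing in for the "most informative subset" (essentially object edges).
    gy, gx = np.gradient(img)
    grad = np.hypot(gx, gy)
    threshold = np.quantile(grad, 1.0 - keep_fraction)
    subset = grad >= threshold

    # Step 2 (proxy): propagate luminance from the subset into the rest of the
    # image by repeated local averaging, keeping the subset pixels fixed.
    recon = np.where(subset, img, img.mean())
    kernel = np.array([[0.05, 0.20, 0.05],
                       [0.20, 0.00, 0.20],
                       [0.05, 0.20, 0.05]])
    for _ in range(iterations):
        recon = ndimage.convolve(recon, kernel, mode='nearest')
        recon[subset] = img[subset]          # re-impose the edge subset
    return subset, recon
```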

    Multifractal Pre-Processing of AVHRR Images to Improve the Determination of Smoke Plumes from Large Fires

    No full text
    In this study, we show how different spectral channels of NOAA-AVHRR acquired data can be used to produce a synthesized signal aimed at helping the characterization of plumes associated with fire events. The synthesized signal is computed using a reconstruction formula in the multifractal microcanonical formalism (herein referred to as MMF). The MMF is a recent development in the analysis of complex signals, well adapted to the study of turbulent acquired data, for instance geophysical fluids. It allows the computation, at each point of the signal's domain, of a singularity exponent characteristic of the scale behaviour of the signal around that point; singularity exponents provide information about the strengths of the transitions inside a signal, and they are related to the multifractal hierarchy associated with structure functions in Fully Developed Turbulence (FDT). In the MMF, it is possible to reconstruct a turbulent signal from the manifold of most singular exponents. We make use of this property by computing supergeometric structures from a thermal infrared channel in NOAA-AVHRR acquired data, and we use the signal's gradient coming from other channels to reconstruct a signal in which plume pixels are easier to detect. This methodology is based on the turbulent properties of the plume accessible from the thermal infrared band; the algorithm is detailed and applied to a specific example, showing a new spatially based method for helping the determination of plume pixels in NOAA-AVHRR data. JRC.H.6-Spatial data infrastructure
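    The abstract mentions per-pixel singularity exponents without giving the estimator. A common microcanonical-style proxy, sketched here purely as an assumption and not as the paper's formula, is a log-log regression of a gradient-based measure against scale:

```python
# Hedged sketch: per-pixel singularity-exponent proxy via log-log regression of a
# gradient-based measure over a few scales. Generic illustration only; the exact
# estimator and reconstruction formula of the paper are not reproduced here.
import numpy as np
from scipy import ndimage

def singularity_exponents(img, radii=(1, 2, 4, 8)):
    img = img.astype(float)
    grad = np.hypot(*np.gradient(img))           # |grad I| as the underlying density

    # Measure over a (2r+1) x (2r+1) window around each pixel: sum of |grad I|,
    # computed with a box filter and rescaled from a mean to an absolute sum.
    log_r = np.log(np.asarray(radii, dtype=float))
    log_mu = []
    for r in radii:
        size = 2 * r + 1
        mu = ndimage.uniform_filter(grad, size=size) * size * size
        log_mu.append(np.log(mu + 1e-12))
    log_mu = np.stack(log_mu, axis=0)             # shape: (n_scales, H, W)

    # Least-squares slope of log mu_r(x) versus log r at every pixel; subtracting the
    # embedding dimension makes smooth regions score ~0, while the sharpest (most
    # singular) transitions get the most negative exponents.
    x = log_r - log_r.mean()
    slope = np.tensordot(x, log_mu - log_mu.mean(axis=0), axes=(0, 0)) / np.sum(x * x)
    return slope - 2.0
```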

    Analyses multiéchelle et multifractale d'images météorologiques : Application à la détection de zones précipitantes (Multiscale and multifractal analysis of meteorological images: application to the detection of precipitating areas)

    No full text
    President: Jean-Pierre NADAL. Rapporteurs: François-Xavier LE DIMET, Étienne MÉMIN. Examiners: Isabelle HERLIN, André SZANTAI, Antonio TURIEL, Hussein YAHIA. We are interested in the characterization, on meteorological infrared images, of the convective cloud systems responsible for severe weather situations such as heavy rainfall. The study of the statistical and physical properties of atmospheric phenomena reveals a chaotic and turbulent behaviour, characterized by the scale invariance of some relevant quantities defined on the system. In this context, we use multiscale image-processing tools derived from thermodynamic concepts that were initially introduced for the analysis of turbulent data. First, a multifractal model allows us to detect the strongest transitions in the signal and provides a hierarchical decomposition of the image; we show that the geometrical structures it exhibits are related to the atmospheric mechanisms. Then, we propose an extension of the multifractal formalism to extract the foci of diffusion of the luminance in the image. We finally identify these foci in infrared images with the convective areas associated with rainfall.
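    As a hedged illustration of the kind of hierarchical decomposition mentioned above (an assumption about the general idea, not the thesis' actual construction), an image can be split into nested sets by thresholding a per-pixel transition-strength map at increasing levels:

```python
# Hedged sketch: nested decomposition of an image by thresholding a per-pixel
# "transition strength" at increasing quantile levels. Illustrative only; the
# thesis builds its hierarchy from proper multifractal singularity exponents.
import numpy as np

def hierarchical_decomposition(strength, levels=(0.02, 0.05, 0.1, 0.2, 0.5)):
    """Return nested boolean masks of increasingly many, decreasingly strong transitions.

    strength : 2-D array, larger values = sharper transitions (e.g. |gradient| or -h(x)).
    levels   : fractions of pixels kept at each level, from most to least selective.
    """
    masks = []
    for frac in levels:
        threshold = np.quantile(strength, 1.0 - frac)
        masks.append(strength >= threshold)
    return masks   # masks[0] (strongest transitions) is contained in masks[-1]

# Example usage on a gradient-magnitude map:
#   masks = hierarchical_decomposition(np.hypot(*np.gradient(img.astype(float))))
```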

    gjacopo/multifractal: Final release of "multifractal" software tools

    No full text
    This is the final release of the "multifractal" software tools (C/Matlab) for multifractal analysis of 1D (time-series) and 2D (image) signals. Currently, no further development is planned.

    Advances in Constrained Connectivity

    No full text
    The concept of constrained connectivity [Soille 2008, PAMI] is summarised. We then introduce a variety of measurements for characterising connected components generated by constrained connectivity relations. We also propose a weighted mean for estimating a representative value of each connected component. Finally, we define the notion of spurious connected components and investigate a variety of methods for suppressing them. JRC.H.6-Spatial data infrastructure
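    The abstract assumes familiarity with constrained connectivity. As a rough, simplified sketch covering only the local-range (alpha) part of [Soille 2008, PAMI], quasi-flat zones can be labelled with a union-find over adjacent pixels whose grey-level difference does not exceed alpha, and a plain mean can then stand in for the proposed weighted representative value:

```python
# Hedged sketch: alpha-connected components (quasi-flat zones) via union-find, plus a
# per-component mean as a simple representative value. The omega (global range)
# constraint of constrained connectivity and the paper's weighting are not implemented.
import numpy as np

def alpha_connected_components(img, alpha):
    img = img.astype(float)
    h, w = img.shape
    parent = np.arange(h * w)

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[rj] = ri

    # Join 4-adjacent pixels whose grey-level difference is at most alpha.
    idx = np.arange(h * w).reshape(h, w)
    flat = img.ravel()
    for a, b in [(idx[:, :-1], idx[:, 1:]), (idx[:-1, :], idx[1:, :])]:
        close = np.abs(flat[a] - flat[b]) <= alpha
        for i, j in zip(a[close], b[close]):
            union(i, j)

    roots = np.array([find(i) for i in range(h * w)])
    _, labels = np.unique(roots, return_inverse=True)
    return labels.reshape(h, w)

def component_means(img, labels):
    # Representative value per component: the plain mean (a stand-in for the
    # weighted mean proposed in the paper).
    sums = np.bincount(labels.ravel(), weights=img.ravel().astype(float))
    counts = np.bincount(labels.ravel())
    return (sums / counts)[labels]
```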

    Edge-Preserving Smoothing Using a Similarity Measure in Adaptive Geodesic Neighbourhoods

    No full text
    This paper introduces a novel image-dependent filtering approach derived from concepts known in mathematical morphology and aimed at edge-preserving smoothing of natural images. Like other adaptive methods, it assumes that the neighbourhood of a pixel contains the essential information required for the estimation of local features. The proposed strategy does not require the definition of any spatial operator, as it determines automatically, from the unfiltered input data, a weighting neighbourhood and a weighting kernel for each pixel location. It essentially consists of a weighted averaging combining both spatial and tonal information, for which a twofold similarity measure is calculated from local geodesic time functions. By designing relevant geodesic masks, two adaptive filtering algorithms are derived that are particularly efficient at smoothing heterogeneous areas while preserving relevant structures in greyscale and multichannel images. JRC.DDG.H.6-Spatial data infrastructure
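    The exact geodesic masks and time functions are not given in the abstract. The sketch below is only an assumption about the general mechanism: for each pixel it computes a geodesic cost within a small window, using scikit-image's minimum-cost-path search on a tonal cost surface, and uses that cost to weight a local average.

```python
# Hedged sketch: edge-preserving smoothing by weighted averaging, where the weight of
# each neighbour decays with an image-dependent geodesic cost from the centre pixel.
# Deliberately brute-force and illustrative; not the algorithm of the paper.
import numpy as np
from skimage.graph import MCP_Geometric

def geodesic_weighted_smooth(img, radius=4, sigma=20.0):
    img = img.astype(float)
    pad = np.pad(img, radius, mode='edge')
    out = np.empty_like(img)
    c = radius  # index of the centre pixel inside each window

    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            win = pad[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            # Tonal cost surface: crossing pixels that differ from the centre is
            # expensive, so the geodesic cost grows quickly across edges.
            cost = 1.0 + np.abs(win - win[c, c])
            dist, _ = MCP_Geometric(cost).find_costs([(c, c)])
            weights = np.exp(-(dist / sigma) ** 2)   # geodesic weighting kernel
            out[y, x] = np.sum(weights * win) / np.sum(weights)
    return out
```

    With such a weighting, a neighbour lying across a strong edge receives an exponentially small weight even when it is spatially close, which is what makes the averaging edge-preserving.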

    Collaborative research-grade software for crowd-sourced data exploration: from context to practice - Part I: Guidelines for scientific evidence provision for policy support based on Big Data and open technologies

    No full text
    The scope and focus of the research reported in this document is to implement a collaborative, high-level, research-grade application software platform for scientific experimentation and data analysis. By doing so, we aim at exploring, extracting value from, and making sense of massive, interconnected datasets. Namely, the software is designed as an application layer that makes use of suitable statistical, exploratory or descriptive techniques, as well as visualisation tools, in order to produce reasonable interpretations of data (e.g. crowd-sourced data from social media, as well as other domain-oriented data such as sensor-based and geospatial data) that are logical but not definitive in their claims. We believe that by starting small and building quickly through a pilot, and by gaining experience from its deployment, it is possible to foster interdisciplinary and collaborative research that conjoins domain expertise. Altogether, this will lead to a more holistic and extensive approach to entire complex systems. In order to consider all the potential issues and address all possible challenges in the future implementation, we adopt a multi-stage approach that aims first to acquire a clear vision of how to use data analysis and analytics, and thereafter to align this vision with the strategic needs of our research institution. Part I of the report provides the current Big Data and open-technologies context (landscapes) in terms of European policies, states the motivation for our approach and the foundations for an open, verifiable, reproducible, collaborative, and participatory framework for its deployment, and formulates applicable recommendations for implementation. Indeed, while it relies mainly on secondary literature (on Big Data, open technologies and data-driven decision making, as well as policy documents), this report actually defines a set of practical guidelines for the deployment and implementation of a Big Data application software solution in our institution. JRC.H.6-Digital Earth and Reference Data

    Image Filtering Based on Locally Estimated Geodesic Functions

    No full text
    This paper addresses the problem of edge-preserving smoothing of natural images. A novel adaptive approach is introduced as a preprocessing stage for feature extraction and/or image segmentation. It performs a weighted convolution by combining both spatial and tonal information in a single similarity measure based on the local calculation of geodesic time functions. Two different strategies are derived for smoothing heterogeneous areas while preserving relevant structures. JRC.DDG.H.6-Spatial data infrastructure